11 research outputs found

    A Tutorial on Sparse Gaussian Processes and Variational Inference

    Full text link
    Gaussian processes (GPs) provide a framework for Bayesian inference that can offer principled uncertainty estimates for a large range of problems. For example, in regression problems with Gaussian likelihoods, a GP model enjoys a closed-form posterior. However, identifying the posterior GP scales cubically with the number of training examples and requires storing all examples in memory. To overcome these obstacles, sparse GPs have been proposed that approximate the true posterior GP with pseudo-training examples. Importantly, the number of pseudo-training examples is user-defined and enables control over computational and memory complexity. In the general case, sparse GPs do not enjoy closed-form solutions, and one has to resort to approximate inference. In this context, a convenient choice is variational inference (VI), where the problem of Bayesian inference is cast as an optimization problem, namely maximizing a lower bound of the log marginal likelihood. This paves the way for a powerful and versatile framework in which pseudo-training examples are treated as optimization arguments of the approximate posterior and are identified jointly with the hyperparameters of the generative model (i.e., prior and likelihood). The framework naturally handles a wide scope of supervised learning problems, ranging from regression with heteroscedastic and non-Gaussian likelihoods to classification with discrete labels, as well as multilabel problems. The purpose of this tutorial is to make the basic material accessible to readers without prior knowledge of either GPs or VI. A proper exposition of the subject also opens the door to more recent advances (such as importance-weighted VI, as well as interdomain, multioutput, and deep GPs) that can serve as inspiration for new research ideas.
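    The core trade-off the abstract describes (exact O(N³) posterior vs. a cheaper approximation built on user-chosen pseudo-training inputs) can be sketched in a few lines of NumPy. This is a minimal subset-of-regressors-style sparse approximation, not the tutorial's full variational scheme; the RBF kernel, lengthscale, noise level, and inducing-point locations are all illustrative assumptions.

    ```python
    import numpy as np

    def rbf(A, B, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between row sets A and B.
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    def exact_gp_posterior_mean(X, y, Xs, noise=1e-2):
        # Closed-form GP regression mean: cubic in the number of training points.
        K = rbf(X, X) + noise * np.eye(len(X))
        return rbf(Xs, X) @ np.linalg.solve(K, y)

    def sparse_gp_posterior_mean(X, y, Z, Xs, noise=1e-2):
        # Sparse approximation with M pseudo-training inputs Z:
        # only an M x M system is solved, O(N M^2) overall instead of O(N^3).
        Kzz = rbf(Z, Z)
        Kzx = rbf(Z, X)
        A = Kzx @ Kzx.T + noise * Kzz + 1e-8 * np.eye(len(Z))  # jitter for stability
        w = np.linalg.solve(A, Kzx @ y)
        return rbf(Xs, Z) @ w

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, (200, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
    Z = np.linspace(-3, 3, 15)[:, None]    # 15 pseudo-training inputs, user-chosen
    Xs = np.linspace(-3, 3, 50)[:, None]   # test inputs

    m_exact = exact_gp_posterior_mean(X, y, Xs)
    m_sparse = sparse_gp_posterior_mean(X, y, Z, Xs)
    print(np.max(np.abs(m_exact - m_sparse)))  # small approximation error
    ```

    In the variational framework of the tutorial, the locations Z (and a full approximate posterior over the inducing outputs) would be optimized jointly with the kernel hyperparameters by maximizing the evidence lower bound; here they are simply fixed on a grid.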

    Surrogate modeling with sequential design for design and analysis of electronic systems

    No full text
    The growing computational demands of modern engineering simulations, used frequently in fields ranging from computational fluid dynamics to electromagnetics, require methodologies capable of performing evaluation-intensive tasks. Popular analyses include design space exploration, visualization, optimization, and sensitivity analysis. This work provides an overview of advancements in surrogate modeling, a data-driven approximation technique. Both sequential design and adaptive modeling are covered, and an integrated platform for surrogate modeling is presented. Finally, a recent technique known as deep Gaussian processes is highlighted as a promising alternative for surrogate modeling of non-stationary response surfaces.

    Hierarchical Gaussian process models for improved metamodeling

    No full text
    Simulations are often used in the design of complex systems, as they allow one to explore the design space without building several prototypes. Over the years, simulation accuracy, as well as the associated computational cost, has increased significantly, limiting the overall number of simulations during the design process. Metamodeling therefore aims to approximate the simulation response with a cheap-to-evaluate mathematical approximation learned from a limited set of simulator evaluations. Kernel-based methods using stationary kernels are nowadays widely used. In many problems, however, the smoothness of the function varies in space, which we call nonstationary behavior [20]. Using stationary kernels for nonstationary responses can be inappropriate and result in poor models when combined with sequential design. We present the application of two recent techniques, deep Gaussian processes and Gaussian processes with a nonstationary kernel, which are better able to cope with these difficulties. We evaluate the methods for nonstationary regression on a series of real-world problems, showing that these recent approaches outperform standard Gaussian processes with stationary kernels. The results show that these techniques are suitable for the simulation community, and we outline the variational inference method for the Gaussian process with a nonstationary kernel.
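    The distinction the abstract draws between stationary and nonstationary kernels can be made concrete with a small sketch. One simple route to a nonstationary kernel (not necessarily the one used in the paper) is input warping, k_ns(x, x') = k(w(x), w(x')): a stationary kernel applied to warped inputs loses translation invariance, so the implied smoothness varies across the input space. The warping function below is a hypothetical, hand-picked choice; in practice it would be learned.

    ```python
    import numpy as np

    def rbf(a, b, ell=1.0):
        # Stationary squared-exponential kernel: a function of a - b only.
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

    def warped_rbf(a, b, warp, ell=1.0):
        # Nonstationary kernel via input warping: k(w(x), w(x')).
        # Smoothness now depends on where the inputs sit, not just their distance.
        return rbf(warp(a), warp(b), ell)

    warp = lambda x: x**3  # hypothetical warping; a real model would learn this

    a, b = np.array([0.1]), np.array([0.2])
    shift = 0.5
    # A stationary kernel is translation invariant: shifting both inputs
    # by the same amount leaves the covariance unchanged ...
    print(rbf(a, b)[0, 0], rbf(a + shift, b + shift)[0, 0])
    # ... while the warped kernel is not: correlations shrink in regions
    # where the warp stretches the input space.
    print(warped_rbf(a, b, warp)[0, 0], warped_rbf(a + shift, b + shift, warp)[0, 0])
    ```

    This is the failure mode the abstract points to: a single stationary lengthscale must compromise between the smooth and the rapidly varying regions of the response, whereas a nonstationary (or deep) GP can adapt its effective lengthscale locally.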